
    Multi-Criteria Analysis in Compound Decision Processes. The AHP and the Architectural Competition for the Chamber of Deputies in Rome (Italy)

    In 1967, a national architectural competition was announced for a preliminary project proposal aimed at the realization of the new building for the Chamber of Deputies in Rome. The outcome of that competition was unusual: eighteen projects were declared joint winners, and consequently no single winner was selected. With reference to that event, this research examines the usefulness of currently employed evaluation tools and highlights the positive effects that one of these techniques would have had as support for the identification of the "winning" project. To this end, a hypothetical re-examination of the decision process of that competition through the Analytic Hierarchy Process (AHP) is developed, analyzing the outputs obtained by applying this technique to the final decision. In addition to confirming the usefulness of evaluation tools for compound and conflicting decision processes, the results of this experiment led to a further understanding of the socio-cultural dynamics related to the original outcomes of the competition analyzed.
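    The AHP referred to above derives priority weights for competing alternatives from a pairwise comparison matrix via its principal eigenvector, with a consistency check on the judgements. A minimal sketch follows; the three criteria and the matrix values on Saaty's 1-9 scale are illustrative assumptions, not data from the actual competition:

    ```python
    import numpy as np

    # Hypothetical pairwise comparison matrix for three judging criteria
    # (e.g. functionality, urban integration, cost); Saaty 1-9 scale.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # Priority weights = normalized principal eigenvector of A.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = eigvecs[:, k].real
    w = w / w.sum()

    # Consistency ratio: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
    n = A.shape[0]
    lambda_max = eigvals.real[k]
    CI = (lambda_max - n) / (n - 1)
    RI = 0.58  # Saaty's random consistency index for n = 3
    CR = CI / RI

    print("weights:", np.round(w, 3))
    print("consistency ratio:", round(CR, 3))  # CR < 0.1 is conventionally acceptable
    ```

    In a full AHP run, one such matrix would be built for the criteria and one per criterion for the competing projects, with the final ranking obtained by weighted aggregation.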

    Regularized Least Squares Cancer Classifiers from DNA microarray data

    BACKGROUND: The advent of DNA microarray technology constitutes an epochal change in the classification and discovery of different types of cancer, because the information provided by DNA microarrays allows the problem of cancer analysis to be approached from a quantitative rather than qualitative point of view. Cancer classification requires well-founded mathematical methods able to predict the status of new specimens with high significance levels starting from a limited number of data. In this paper we assess the performance of Regularized Least Squares (RLS) classifiers, originally proposed in regularization theory, by comparing them with Support Vector Machines (SVM), the state-of-the-art supervised learning technique for cancer classification from DNA microarray data. The performance of both approaches has also been investigated with respect to the number of selected genes and different gene selection strategies. RESULTS: We show that RLS classifiers perform comparably to SVM classifiers, as the Leave-One-Out (LOO) error evaluated on three different data sets shows. The main advantage of RLS machines is that, to solve a classification problem, they use a linear system of order equal to either the number of features or the number of training examples. Moreover, RLS machines provide an exact measure of the LOO error with just one training. CONCLUSION: RLS classifiers are a valuable alternative to SVM classifiers for cancer classification from gene expression data, due to their simplicity and low computational complexity. Moreover, RLS classifiers show generalization ability comparable to that of SVM classifiers, even when the classification of new specimens involves very few gene expression levels.
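    The "exact LOO error with just one training" property of RLS follows from a standard hat-matrix identity: with H = K(K + λI)⁻¹, the leave-one-out prediction is f_i^(−i) = (f_i − H_ii·y_i)/(1 − H_ii). A minimal sketch, using synthetic two-class data, a linear kernel, and a regularization value that are illustrative assumptions rather than the paper's settings:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic two-class "expression" data: 40 samples, 200 features.
    n, p = 40, 200
    X = rng.normal(size=(n, p))
    y = np.where(np.arange(n) < n // 2, 1.0, -1.0)
    X[y == 1, :5] += 1.5  # class +1 shifted on the first 5 "genes"

    lam = 1.0
    K = X @ X.T                                # linear kernel: n x n system
    H = K @ np.linalg.inv(K + lam * np.eye(n)) # hat matrix
    f = H @ y                                  # in-sample RLS predictions

    # Exact leave-one-out predictions from the single fit:
    # f_i^(-i) = (f_i - H_ii * y_i) / (1 - H_ii)
    h = np.diag(H)
    f_loo = (f - h * y) / (1 - h)
    loo_error = np.mean(np.sign(f_loo) != y)
    print("exact LOO error:", loo_error)
    ```

    This is why one RLS fit suffices where SVM model selection would require retraining n times for a genuine leave-one-out estimate.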

    Comparative study of gene set enrichment methods

    Background: The analysis of high-throughput gene expression data with respect to sets of genes rather than individual genes has many advantages. A variety of methods have been developed for assessing the enrichment of gene sets with respect to differential expression. In this paper we provide a comparative study of four of these methods: Fisher's exact test, Gene Set Enrichment Analysis (GSEA), Random-Sets (RS), and Gene List Analysis with Prediction Accuracy (GLAPA). The first three methods use associative statistics, while the fourth uses predictive statistics. We first compare all four methods on simulated data sets to verify that Fisher's exact test is markedly worse than the other three approaches. We then validate the other three methods on seven real data sets with known genetic perturbations, and compare the methods on two cancer data sets where our a priori knowledge is limited. Results: The simulation study highlights that none of the three methods consistently outperforms the others. GSEA and RS are able to detect weak signals of deregulation, and they perform differently when genes in a gene set are both differentially up- and down-regulated. GLAPA is more conservative, and large differences between the two phenotypes are required for the method to detect differential deregulation in gene sets. This is because the enrichment statistic in GLAPA is prediction error, a stronger criterion than the classical two-sample statistics used in RS and GSEA. This was reflected in the analysis on real data sets, where GSEA and RS were significant for particular gene sets while GLAPA was not, suggesting a small effect size. We find that the gene set enrichment ranking induced by GLAPA is more similar to that of RS than to that of GSEA. More importantly, the rankings of the three methods share significant overlap. Conclusion: The three methods considered in our study recover relevant gene sets known to be deregulated in the experimental conditions and pathologies analyzed. There are differences between the three methods, and GSEA seems to be more consistent in finding enriched gene sets, although no method uniformly dominates over all data sets. Our analysis highlights the deep difference between associative and predictive methods for detecting enrichment, and supports the use of both to better interpret results of pathway analysis. We close with suggestions for users of gene set methods.
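    Of the four methods compared, Fisher's exact test is the simplest to illustrate: it tests a 2x2 contingency table of gene-set membership versus differential expression status. A minimal sketch; the gene counts below are hypothetical, not taken from the paper's data sets:

    ```python
    from scipy.stats import fisher_exact

    # Hypothetical counts: of 20000 genes, 500 belong to the gene set and
    # 1000 are called differentially expressed (DE); 60 genes are in both.
    total_genes = 20000
    in_set_and_de = 60
    in_set_not_de = 500 - in_set_and_de
    de_not_in_set = 1000 - in_set_and_de
    neither = total_genes - in_set_and_de - in_set_not_de - de_not_in_set

    table = [[in_set_and_de, in_set_not_de],
             [de_not_in_set, neither]]

    # One-sided test: is the overlap larger than expected by chance?
    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
    ```

    Under independence the expected overlap here would be 500 x 1000 / 20000 = 25 genes, so an observed overlap of 60 yields a small p-value. Note that this test depends on a hard DE cut-off, which is one reason the paper finds it markedly worse than the threshold-free GSEA, RS, and GLAPA approaches.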

    Promoter methylation correlates with reduced NDRG2 expression in advanced colon tumour

    Background: Aberrant DNA methylation of CpG islands of cancer-related genes is among the earliest and most frequent alterations in cancerogenesis and might be of value for either diagnosing cancer or evaluating recurrent disease. This mechanism usually leads to inactivation of tumour-suppressor genes. We designed the current study to validate our previous microarray data and to identify novel hypermethylated gene promoters. Methods: The validation assay was performed in a different set of 8 patients with colorectal cancer (CRC) by means of quantitative reverse-transcriptase polymerase chain reaction analysis. The differential RNA expression profiles of three CRC cell lines before and after 5-aza-2'-deoxycytidine treatment were compared to identify the hypermethylated genes. The DNA methylation status of these genes was evaluated by means of bisulphite genomic sequencing and methylation-specific polymerase chain reaction (MSP) in the 3 cell lines and in tumour tissues from 30 patients with CRC. Results: Data from our previous genome search were confirmed in the new set of 8 patients with CRC. In this validation set, six genes showed high induction after drug treatment in at least two of the three CRC cell lines. Among them, the N-myc downstream-regulated gene 2 (NDRG2) promoter was found methylated in all CRC cell lines. NDRG2 hypermethylation was also detected in 8 out of 30 (27%) primary CRC tissues and was significantly associated with advanced AJCC stage IV. Normal colon tissues were not methylated. Conclusion: The findings highlight the usefulness of combining gene expression patterns and epigenetic data to identify tumour biomarkers, and suggest that NDRG2 silencing might influence tumour invasiveness, being associated with a more advanced stage.

    The commissioning of the CUORE experiment: the mini-tower run

    CUORE is a ton-scale experiment approaching the data-taking phase at the Gran Sasso National Laboratory. Its primary goal is to search for the neutrinoless double-beta decay of 130Te using 988 crystals of tellurium dioxide. The crystals are operated as bolometers at about 10 mK, taking advantage of one of the largest dilution cryostats ever built. Concluded in March 2016, the cryostat commissioning consisted of a sequence of cool-down runs, each one integrating new parts of the apparatus. The last run was performed with the fully configured cryostat, and the thermal load at 4 K reached the impressive mass of about 14 tons. During that run a base temperature of 6.3 mK was reached and maintained for more than 70 days. An array of 8 crystals, called the mini-tower, was used to check bolometer operation, readout electronics and DAQ. Results are presented in terms of cooling power, electronic noise, energy resolution and preliminary background measurements.

    Results from the Cuore Experiment

    The Cryogenic Underground Observatory for Rare Events (CUORE) is the first bolometric experiment searching for neutrinoless double-beta decay that has reached the 1-ton scale. The detector consists of an array of 988 TeO2 crystals arranged in a cylindrical compact structure of 19 towers, each of them made of 52 crystals. The construction of the experiment was completed in August 2016 and data taking started in spring 2017, after a period of commissioning and tests. In this work we present the neutrinoless double-beta decay results of CUORE from a total TeO2 exposure of 86.3 kg yr, characterized by an effective energy resolution of 7.7 keV FWHM and a background in the region of interest of 0.014 counts/(keV kg yr). In this physics run, CUORE placed a lower limit on the half-life of neutrinoless double-beta decay of 130Te of T1/2 > 1.3 × 10^25 yr (90% C.L.). Moreover, an analysis of the background of the experiment is presented, as well as a measurement of the 130Te 2νββ decay with a resulting half-life of T1/2 = [7.9 ± 0.1 (stat.) ± 0.2 (syst.)] × 10^20 yr, which is the most precise measurement of this half-life and is compatible with previous results.

    Evaluation, financing, planning and design of contemporary urban interventions

    The processes of transformation of urban settlements involve individual buildings, but also portions of consolidated urban tissue that characterize the structure and form of minor historical centres as well as large urban centres. These processes are usually related to the objectives of enhancing the physical, economic and social aspects of urban change. The technical, financial, procedural and managerial interventions should be developed and implemented focusing on specific objectives that justify the need for, and promote the opportunity to gain, funding. Nowadays funding plays a critical role in planning, designing, implementing and, possibly, managing interventions during their operational phases. All interventions must be coherent with the territorial “vocations” and satisfy the real needs of individuals and of the whole community. Over time, different assessment tools have been developed in order to check the congruence between choices about the “project” (technical, financial, procedural and managerial) and the objectives of planning interventions, often related to the different funding opportunities available. This paper aims to build a framework concerning the evaluation procedures that can be adopted, also in relation to the different ways of financing. Moreover, the aim of this critical reflection is to highlight how the different choices about the project, its operational phases and its management must necessarily seek the maximum benefit at the least expenditure of resources, according to the objectives to be pursued with funding.